

NeurIPS 2019: Pseudo-Extended Markov chain Monte Carlo (paper ID: 2415)

Neural Information Processing Systems

We would like to thank the reviewers for dedicating their time to review our paper and for the helpful feedback they have provided. All of the reviewers' minor comments and corrections have been incorporated into the paper. Below, we address the reviewers' main questions.

The paper focuses on HMC sampling; unfortunately, HMC cannot be applied in the discrete setting due to the discontinuity of the target.

"How do you recommend setting π and g to best estimate β?" It is quite straightforward to implement pseudo-extended HMC within Stan by ...

"As a minor comment, in line 58 it would be good to state that delta is an arbitrary differentiable function." This is a good point and we have corrected this in the paper.

"The experiments in 4.1 and 4.2 use the RMSE error of the target variables, which is quite unusual." ...




Reviews: Pseudo-Extended Markov chain Monte Carlo

Neural Information Processing Systems

Update: I have read the author response and am satisfied with the commitment to elaborate on \beta and \pi and to simplify the Stan PE code with a "pseudo-extended" function. This paper presents a new MCMC sampling method, pseudo-extended MCMC, which uses an instrumental distribution to project the target into a higher-dimensional space where the modes are connected, making it easier for the sampler to mix. A default instrumental distribution based on tempering is provided. The method is compared to existing baselines, showing efficacy on three benchmark datasets. The paper is well-placed within the existing literature.
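To give a rough intuition for the tempered instrumental distribution the review mentions (this is an illustrative sketch, not the authors' code; the target, grid, and `barrier_depth` helper are our own assumptions): raising an unnormalised density gamma to a power beta < 1 scales the log-density barrier between modes by beta, so the trough separating well-separated modes becomes shallow enough for a sampler to cross.

```python
import numpy as np

# Unnormalised bimodal target: two well-separated Gaussian modes (assumed example).
def log_gamma(x):
    return np.logaddexp(-0.5 * (x - 5.0) ** 2, -0.5 * (x + 5.0) ** 2)

def barrier_depth(logpdf):
    """Gap between the mode height and the trough between the modes,
    evaluated on a grid spanning both modes."""
    xs = np.linspace(-5.0, 5.0, 1001)
    vals = logpdf(xs)
    return vals.max() - vals.min()

beta = 0.1  # inverse temperature of the tempered instrumental q ∝ gamma**beta
full = barrier_depth(log_gamma)
tempered = barrier_depth(lambda x: beta * log_gamma(x))
# Tempering multiplies the log-density barrier by beta, so the trough
# between the modes is much shallower under q than under the target.
print(full, tempered)
```

Since the log-density is simply scaled by beta, the barrier under the tempered instrumental is exactly beta times the original barrier, which is why a small beta makes the modes effectively connected.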


Reviews: Pseudo-Extended Markov chain Monte Carlo

Neural Information Processing Systems

The reviewers reached consensus that the paper makes a valuable contribution to MCMC. There are specific suggestions for improving the experiments that we ask the authors to consider seriously.


Pseudo-Extended Markov chain Monte Carlo

Neural Information Processing Systems

Sampling from posterior distributions using Markov chain Monte Carlo (MCMC) methods can require an exhaustive number of iterations, particularly when the posterior is multi-modal, as the MCMC sampler can become trapped in a local mode for a large number of iterations. In this paper, we introduce the pseudo-extended MCMC method as a simple approach for improving the mixing of the MCMC sampler for multi-modal posterior distributions. The pseudo-extended method augments the state-space of the posterior using pseudo-samples as auxiliary variables. On the extended space, the modes of the posterior are connected, which allows the MCMC sampler to easily move between well-separated posterior modes. We demonstrate that the pseudo-extended approach delivers improved MCMC sampling over the Hamiltonian Monte Carlo algorithm on multi-modal posteriors, including Boltzmann machines and models with sparsity-inducing priors.
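As a rough sketch of the extended-target construction described in the abstract (our reading of the method, not the authors' reference implementation; the function names, the tempered instrumental q ∝ gamma**beta, and the specific density form are assumptions): the target is augmented with N pseudo-samples, and the extended log-density below can be handed to any gradient-based sampler such as HMC.

```python
import numpy as np
from scipy.special import logsumexp

def pseudo_extended_logpdf(xs, log_gamma, beta=0.5):
    """Log-density (up to a constant) of a pseudo-extended target on N pseudo-samples.

    xs        : array of shape (N, d) -- the N pseudo-samples.
    log_gamma : unnormalised log-density of the original target.
    beta      : inverse temperature of the tempered instrumental
                q(x) ∝ gamma(x)**beta, which flattens the target so that
                its modes become connected on the extended space.

    Assumed extended target (pseudo-marginal style):
        pi_N(x_1..x_N) ∝ (1/N) * sum_i gamma(x_i)/q(x_i) * prod_j q(x_j)
    """
    lg = np.array([log_gamma(x) for x in xs])  # log gamma(x_i)
    lq = beta * lg                             # log q(x_i), unnormalised
    # log of (1/N) sum_i exp(lg_i - lq_i), plus sum_j lq_j
    return logsumexp(lg - lq) - np.log(len(xs)) + lq.sum()

# Example: a 1-D target with two well-separated modes (assumed toy problem).
def log_gamma(x):
    return np.logaddexp(-0.5 * (x - 5.0) ** 2, -0.5 * (x + 5.0) ** 2)

xs = np.array([[0.0], [1.0]])  # N = 2 pseudo-samples
print(pseudo_extended_logpdf(xs, lambda x: log_gamma(x[0])))
```

A sanity check on this construction: with a single pseudo-sample (N = 1), the weight gamma(x)/q(x) and the factor q(x) cancel, so the extended log-density reduces exactly to log gamma(x).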


Pseudo-Extended Markov chain Monte Carlo

Nemeth, Christopher, Lindsten, Fredrik, Filippone, Maurizio, Hensman, James

Neural Information Processing Systems
